# Knowledge-Enhanced Pre-training
## Roberta Large Ernie2 Skep En
SKEP (Sentiment Knowledge Enhanced Pre-training) was proposed by Baidu in 2020 and is designed specifically for sentiment analysis tasks. The model incorporates multiple types of sentiment knowledge through sentiment masking and three sentiment-specific pre-training objectives (a usage sketch follows this entry).
Tags: Large Language Model · Transformers · English

Author: Yaxin
## Bert Election2020 Twitter Stance Biden KE MLM
License: GPL-3.0
This is a pre-trained language model based on the BERT-base architecture, trained with Knowledge-Enhanced Masked Language Modeling (KE-MLM) and optimized for detecting stance toward Joe Biden in tweets from the 2020 US election (a usage sketch follows this entry).
Tags: Text Classification · English
Author: kornosk